Silicon Valley Fairy Dust

by Sherry Turkle on November 6, 2023

Silicon Valley companies began life with the fairy dust of 1960s dreams sprinkled on them. The revolution that 1960s activists dreamed of had failed, but the personal computer movement carried that dream into the early personal computer industry. Hobbyist fairs, a communitarian language, and the very place of their birth encouraged this fantasy. Nevertheless, it soon became clear that, like all companies, what these companies wanted most of all was to make money. Not to foster democracy, not to foster community and new thinking, but to make money.

Making money with digital tools in neoliberal capitalism led to four practices that constituted a baseline ideology-in-practice.

1: The scraping and selling of user data. As users became accustomed to this, accepting it as the cost of online participation, the idea of privacy changed its meaning. Living in a state of continual surveillance became normalized. As Foucault taught us, with this kind of change, the idea of personhood changed as well: intimacy, privacy, and democracy are intricately woven together.

2: The normalization of lying to the public while wearing a public face of moral high-mindedness. In 2021, when Facebook wanted to start an Instagram for under-thirteen-year-olds, it hid its internal research about how teenage girls felt after beginning to use Instagram. The girls said they had more suicidal thoughts, their eating disorders got worse, and they felt worse about their bodies. But Facebook was going to proceed with its under-thirteen Instagram until a whistleblower brought all this to light. The lack of commitment to truth in Silicon Valley companies is politically crucial because they are in a unique position to routinely dispense disinformation as information.

3: Silicon Valley companies that have user-facing platforms want, most of all, to keep people at their screens. Early on, they discovered that a good formula was to make users angry and then keep them with their own kind. When they were siloed, people could be stirred up into being even angrier at those with whom they disagreed. Predictably, this formula undermined the conversational attitudes that nurture democracy — above all, attitudes of tolerant listening. Digital manipulation undermines and then destroys the very possibility of conversation.

4: Avatars have politics. Online conversations make people feel less vulnerable than the face-to-face kind. As engagement at a remove has become a social norm, it has become more acceptable to stop taking the time to meet in person, even in professions where conversation was highly valued, such as teaching, consulting, and psychotherapy. In remote classrooms and meetings, in conversations-by-text, it’s easy to lose skills of listening, especially listening to people who don’t share your opinions. Democracy works best if you can talk across differences. It works best if you slow down to hear someone else’s point of view. We need these skills to reclaim our communities, our democracy, and our shared common purpose. In today’s political climate, we most need the political skills that screen objects erode.

Silicon Valley suggests that technology will cure social problems, but it exacerbates the social problems it claims its connectivity will cure. Facebook claims to be our cure for loneliness, but online, we became alone together, less able to find a common cause.

Fewer and fewer Americans know even one other person they could call in time of emergency. We suffer an epidemic of loneliness, even as we live immersed in our technologies of hyperconnection. This loneliness is at the heart of growing rates of depression, drug abuse, and suicide. If Americans could learn to turn toward each other in the real and to act together to save their communities, that would help us save ourselves. Silicon Valley ideology insists that virtual organizing will translate into real-world connection, but online life puts you into habits of mind that don’t make this translation easy. Ultimately, this has to do with tolerating and even embracing friction. In this context, Silicon Valley proposes a meta-object, the metaverse, that threatens to put us into precisely all the wrong habits of mind.

In real life, things go awry. We need to tolerate each other’s differences. Virtual reality is friction-free. The dissidents are removed from the system. People get used to that, and real life seems intimidating. Maybe that’s why so many internet pioneers are tempted by going to space or the metaverse. That sense of a clean slate. In real life, there is history.

Silicon Valley ideology wants to deny history because we might object to what is being done to our own. When we are online, our lives are bought and sold in bits and pieces.

From early on, pointing out this harm was most often met with a shrug. It was the cost of having social media “for free,” then of having Gmail “for free.”  In the early years of Facebook, one young woman told me she wasn’t much concerned that Facebook was looking at her data. She said: “Who would care about me and my little life?”

“Who would care about me and my little life?” Well, Facebook did. Social media evolved to sell our private information in ways that fractured both our intimacy and our democracy.

But even after so many people knew this, conversations about it, like conversations about climate change, avoided talking about its reality.

Here is how Lana, who had just graduated from college, talked about how she organizes herself not to think about the realities of online privacy:

On Facebook, I try to keep it light. So I don’t use Facebook for conversations of real consequence. And I’m glad not to have anything controversial on my mind because I can’t think of any online place where it would be safe to have controversial conversations.

Now, in fact, Lana had no lack of controversial opinions. But we can hear her convincing herself that they are not worth expressing because her medium would be online, and there is no way to talk “safely” there. This is Foucault brought down to earth. The politics of Facebook is a politics of tutelage in forgetting. Lana is learning to be a citizen in an authoritarian regime.

Lana says she’ll worry about online privacy “if something bad happens.” But something bad has already happened. She has learned to self-censor. She does not see herself as someone with a voice. In this small example, we see how our narrowed sense of privacy undermines the habits of thought that nurture democracy.

The former chairman of Google once said that if you’re worried about privacy, don’t be a Luddite: “Just be good.” In a democracy, we all need to begin with the assumption that everyone has something to hide, a zone of private action and reflection, a zone that needs to be protected no matter what your techno-enthusiasms. You need space for real dissent. A mental space and a technical space. It’s a private space where people are free to “not be good.”

This conversation about technology, privacy, and democracy is not Luddite, and it is never too late to remember to have it.

{ 25 comments }

1

Just an Australian 11.06.23 at 10:00 am

But we can hear her convincing herself that they are not worth expressing because her medium would be online, and there is no way to talk “safely” there.

With respect, she’s not afraid of Facebook and their monetising ways. She’s afraid of her peers, or their peers. And future employers.

We have met the devil, and he is us.

2

Matthew Heath 11.06.23 at 10:02 am

“This conversation about technology, privacy, and democracy is not Luddite, and it is never too late to remember to have it.”
Some parts of the conversation (e.g. on Tech Won’t Save Us and This Machine Kills) are Luddite and that is Good because the Luddites were Good.

3

MisterMr 11.06.23 at 11:37 am

People self-censor face to face a lot though, when they are speaking with people who might be offended by their opinion.

In fact, the problem might be the opposite: that people on the net group with people with similar ideas so they don’t need to self-censor, but this creates echo chambers.

4

Mike Huben 11.06.23 at 11:42 am

This is excellent.

Do privacy laws in the EU affect this problem?

5

CP Norris 11.06.23 at 2:21 pm

Early on, they discovered that a good formula was to make users angry and then keep them with their own kind. When they were siloed, people could be stirred up into being even angrier at those with whom they disagreed. Predictably, this formula undermined the conversational attitudes that nurture democracy — above all, attitudes of tolerant listening.

To some extent this is a positive shift away from the old network news model of “one side says X and the other side says Y, and far be it from us to determine the truth”. It was tolerant in some superficial sense but it enabled worse intolerance.

And I can’t be the only one here old enough to remember 1990s AM talk radio.

6

Lynne 11.06.23 at 4:17 pm

Sherry Turkle, I am delighted to see you posting here! Only a week ago I recommended your book, Alone Together, to a friend.

I did not know about your point 2 on Facebook. I think this must be a challenging time to raise children. I raised mine just before all kids had smartphones, and we had enough battles over video game time without the challenge of online social media.

I will be reading the comments with interest.

7

Alex SL 11.06.23 at 10:32 pm

I do not doubt that the algorithms and incentive structures of social media have an impact on how we feel and behave. But it should be noted that most of the outcomes discussed here, like political polarisation and self-censorship, have happened before social media and would happen to at least some degree without them. The Nazis did not rise to power because they were good at manipulating Facebook, for example, but because too many people in that time and place were receptive to ideas like national grievance and a glorious natural state being corrupted by modernism. Even if social media didn’t exist, I would not consider it wise to share everything I think with random strangers or even only my colleagues at work. I am an atheist, and I don’t need to have an argument with a random co-worker who is religious when we could just get on with our lives in peace. That’s not being cowed by the dynamics of social media, it is being a sensible person.

Which leads to the other point: with the caveat that we are more vulnerable the younger and the less experienced we are, we do have agency, especially as alleged adults who claim to deserve the vote at elections and the ability to enter into contracts with other such adults. We could just, you know, not fall prey to the polarisation and bubble-formation of Facebook and the like, if we wanted. A mental model of humans as sheeple at the mercy of, e.g., Zuckerberg and Bannon is self-defeating, because, if that is how humans are, how is it that Zuckerberg and Bannon have agency? Or if they have agency, why not the other eight billion of us? The problem is that lots of people prefer being emotionally riled up and hateful over being rational and kind.

The one aspect I fully agree with is how odd it is that people have become so accepting of total surveillance – as long as it is done by an enormous corporation that exploits their data to make a profit instead of by a government that could at least claim to use them to keep us safe, whether we believe that or not. I have to accept the reality of constant total surveillance because I couldn’t function in my work and life otherwise, but I do not understand why there isn’t at least broader concern about it, even if it were only futile grumbling.

8

JustSomeRodent 11.06.23 at 10:50 pm

It’s easy to wax nostalgic about the civility of the analog era and the “lost art of listening” if you’re confident and articulate in Real Space. Those of us who aren’t may have, at least, a more nuanced take on the “politics of avatars”.

9

J-D 11.07.23 at 3:02 am

Lana says she’ll worry about online privacy “if something bad happens.” But something bad has already happened. She has learned to self-censor.

It’s not a bad thing when people learn to self-censor. It’s a good thing when people learn to self-censor. Everybody should learn to self-censor, and nearly everybody does.

10

roger gathmann 11.07.23 at 7:51 am

Interesting question, this of loneliness. There’s a Robert Frost line in a poem about going into a woods: “The loneliness includes me unawares.” That is sorta the American spirit, the dark side of an individualist credo that is at the base of doing nothing about the stripping away of privacy, since to do something would require communal action – it would require solidarity. The American loneliness pre-dates the internet, I think – the title of that fifties best seller, The Lonely Crowd, tells a story. Is it possible that the sixties pixie dust comes out of a sense of lost community? Lost, actually, systematically – the racist foundations of the American republic, the violence of ethnic cleansing, and the imposition of an impossible and alienating capitalist ethic might have found its metastasized correlative object in social media. On the other hand, social media is not just American, and not just Silicon Valley. It plays in China, India, Tunisia – everywhere. Maybe this is what globalizing solitude as our only alternative looks like.

11

Mike Huben 11.07.23 at 11:18 am

J-D @ 9:

That’s a really good point too. I’ll never forget in my Catholic upbringing (around age 8) when I realized that I couldn’t remember my sins properly for confession. So I mentioned to my mother that I should keep a written list of them, and saw her horrified reaction: “Don’t do that!”

And my father’s experience being raised in the Hitler Youth in Austria, expected to report disloyalty even by his own parents (before he emigrated to the USA and later served in the US Army).

Self-censorship is a useful survival skill. What’s important is that we have institutions that have no or low penalties for revealing our preferences. Such as the secret ballot. Such as personal security from attack. Self-revelations on the internet or recorded in the media are mostly permanently available: it might not matter as much if we had a safer society.

12

mw 11.07.23 at 12:13 pm

Early on, they discovered that a good formula was to make users angry and then keep them with their own kind. When they were siloed, people could be stirred up into being even angrier at those with whom they disagreed.

But this is also the business model of stridently partisan pre-20th-century newspapers and, again, of 21st-century media of all kinds (up to and including late night talk show hosts). SV certainly didn’t invent this. What the Internet did do was destroy the business model of ‘objective’ 20th-century news organizations, by wiping out their revenue from classified ads (Craigslist) and advertising generally. The ad-supported approach no longer works on its own, and even-handed coverage doesn’t bring in or satisfy the true believers who will voluntarily pay subscription fees only as long as they’re provided the kind of ‘red meat’ partisan coverage they desire.

13

Hugh Mann 11.07.23 at 12:21 pm

Like Lynne, I’m extremely thankful that our kids were raised just before social media/smartphones became everything.

OTOH my kids seem perfectly happy with the notion that everything they do online is visible to our security overlords…

14

Psychoceramicist 11.08.23 at 7:59 pm

Silicon Valley ideology is, I think, much older than it thinks it is. The Wired magazine/Homebrew Computer Club ideology came mostly from Stewart Brand and his Whole Earth Catalog, whose thinking derived largely from Buckminster Fuller, whose own thinking was a mixture of New England transcendentalism and 20th century technocracy. Transcendentalism ultimately came out of the secularization of the Puritans, and before that Puritanism was an offshoot of the Calvinist/Reformed Protestant traditions in Europe. There’s a direct line between Calvin and his gaggle of Reformed preachers in Geneva and TED talkers in the modern Bay Area.

In my opinion this is a specific ideology out of early modern Europe, and if it has an ur-format it’s something like “there is a faith/ideology that we can adopt that can free us from the messy, contradictory, and/or corrupt institutions that are currently constraining us, there is a technology (like Bible reading or computation) that can enable us to achieve that faith/ideology, and from there follows salvation/liberation”. As a side note, for various reasons this ideology has always paired very well with whatever kind of capitalism was prevailing where it took root.

Like all communitarian ideologies, it fails over time because essential human difference – in interests, perspectives, feelings, personalities – can’t be overcome, and organized institutions like those that exist in modern, bureaucratic, and pluralistic liberal democracy are needed to keep people living together with minimal violence and oppression.

15

Psychoceramicist 11.08.23 at 8:17 pm

@ 9 and 13

I think in younger generations (speaking as an aging but not yet elder Millennial who graduated from high school in 2007, which as far as I can tell was the last year that adolescent digital socializing was mostly nonexistent except for certain subcultures) it’s less that kids are “perfectly happy” and more that they believe the war for privacy has been irrevocably lost. Older generations gave it away, and just like the warmer climate it’s part of the air they breathe.

16

Phil Wolff 11.09.23 at 8:15 pm

Making money with digital tools in neoliberal capitalism led to four practices that constituted a baseline ideology-in-practice.

The observation seems correct, that drivers and corporate cultures and regulatory capture and… all led to this dysfunctional outcome. What level of analysis and detail would let us craft alternatives that would be less painful?

17

Phil H 11.10.23 at 11:38 am

I think it’s time to start taking a firm stand against the nonsensical, self-contradictory term “self-censorship.”
Censorship is when someone else tells you what you can or cannot say (more exactly, publish). The very notion that there is some “self” separate from me, who can impose rules on my behavior, is schizophrenic (in the pop-culture sense, not in any real medical sense). It’s effectively denying that I have the right as a person to decide what I want to say. The very term seems to suggest that if I have a thought, I must desire to express it (publish it, in fact); and that any psychological process that filters or stops that expression is “censorship.” I find this to be infantilising.
Turkle says: “But something bad has already happened. She has learned to self-censor.” Let’s assume that she’s right about the fact of Lana’s behaviour and thoughts. What she is in fact describing is the process of socialisation, where we learn that it’s perfectly possible to hold opinions and not express them.
Not having a filter is not a good thing. It’s Trump.

18

Christian Matschke 11.12.23 at 1:16 am

Data equal profits. That is why the digital economy does everything to extract user data and at the same time tries to convince us that the benefit is ultimately ours. I have a feeling the situation will only change once there is proof, or clear indications at least, that more data does in fact NOT lead to a more efficient allocation of capital.

19

Max Kilger 11.12.23 at 4:48 am

Well-written piece of analysis. When I was a grad student at Stanford during the beginnings of the personal computer revolution, I spent a fair amount of time in field observation of the early hacking community. The “Just be good” admonishment was in fact a faint echo of the early hacking community’s philosophy, and by that time it was merely an echo in an empty room. The mostly altruistic motivations of this community joined the monetary motivations of Silicon Valley in unleashing technology and technology use with an unknowing, unidimensional perspective that was largely oblivious to the social and psychological changes it would wreak on relationships between people and machines as well as relationships between people – some of them positive but many of them carrying negative aspects. Now we are entering an even more unknown epoch with AI, where the relationship between people and machines starts to flow in the opposite direction – from people to machines. While there is most definitely a much larger community of concerned scientists looking at this newly emerging phenomenon, there are likely much more powerful forces at work here that we know even less about. It gives us in the social sciences community much concern.

20

David in Tokyo 11.12.23 at 6:37 am

Hmm. I’m more worried about enshittification than privacy. To a certain extent, the internet is like a public CV: you put on it what you want. If you are not careful, it can be problematic, and there probably need to be ways to protect folks, but I can’t, offhand, see any way to do that, so keep at it.

(There’s a fine line between putting on a good face and self-censoring; see “@djl” on Mastodon for a bad example.)

Twitter was an important means of communication before Elon messed it up, and we still haven’t figured out what to do about it. Ditto for YouTube, although it’s not been so completely trashed yet: they’re tired of not being able to interrupt your concert video with ads, and are in a whack-a-mole game with the ad blockers. I’m worried the ad blockers will lose.

(When an ad comes up, I close the window and look for something else to watch or do. The folks who get specific sponsors put the ads into the videos, and they can be fast-forwarded over, and appear in breaks in the content.)

(I don’t understand ads: they can’t possibly work. When I spend a lot of time looking for something, it’s because I’m not finding it, so if the “algorithms” see all the time I’m looking for a tripod, they assume I’m still looking; but I’ve given up. But if I search for something and find it, I don’t need it any more.)

Whatever, I’m a bad one to listen to because I never used Twitter or Facebook (or instagram or tiktok or spotify or ebay or paypal or…) and only signed up for Mastodon in the hope it’d piss off Elon.

21

B. Schmaling 11.12.23 at 1:55 pm

Next up, the attention of the angry and siloed will now be harvested by new “empathetic” personal AIs to guide them through life.

22

J-D 11.12.23 at 10:14 pm

I don’t understand ads: they can’t possibly work.

People who make ads get paid to make them, so the business model is working for them, at least.

23

Alex SL 11.13.23 at 8:48 pm

Even beyond algorithms on social media, I do not understand ads and how they are supposed to work. Mostly I am annoyed by them, and because many are extremely obnoxious and insult the intelligence of the viewer/listener, I am more likely to avoid a product if I see it advertised. Very rarely do I see a non-obnoxious ad. Even then, what never happens is that I decide to buy a product or service because I have seen an ad. I buy it because there is no alternative, because it is the same thing I always buy and it hasn’t failed me so far, or because a person I trust recommended it. I am certainly not going to start “trading crypto” just because some dude with a super-macho voice pops up on YouTube and tells me to, because, sorry, despite all my faults I am not that stupid.

There are IMO two possible explanations. Either ads work on enough people who are very different from me to make the investment worthwhile, or the ad business is to a large degree a fraud perpetrated on the people paying for ads. The latter is probably more relevant to online ads, at least, because everybody has a stack of anecdotes about how the algorithms are downright counter-productive.

24

gmoke 11.13.23 at 9:36 pm

Good to run across you here, Mike. As for self-censorship, I agree with your point that self-censorship is often simply politeness, consideration, or just good common sense. However, I think Sherry Turkle is leaning more towards the self-censorship described by Ursula K. Le Guin in her essay “The Stalin in the Soul,” which is much more pernicious and dangerous.

I remember some friends who worked in China at the end of the 1970s and early 1980s explaining how Chinese parenting worked. The parent would not chastise a child for reaching for something the parent didn’t want them to have. They would simply take the reaching hand and put it back in the child’s lap, time after time, again and again, until, eventually, the child doesn’t bother to reach at all.

25

J-D 11.15.23 at 5:14 am

Good to run across you here, Mike. As for self-censorship, I agree with your point that self-censorship is often simply politeness, consideration, or just good common sense. However, I think Sherry Turkle is leaning more towards the self-censorship described by Ursula K. Le Guin in her essay “The Stalin in the Soul,” which is much more pernicious and dangerous.

Sometimes people hold back from saying things which it would have been a bad choice to have said, and that’s good; sometimes people hold back from saying things which it would have been a good choice to have said, and that’s bad. Just the other day I held back from saying ‘I find you tedious and would prefer to avoid your company if I can.’ It would have been a bad choice for me to say that (even though it was true), and it’s a good thing that I didn’t.

If not learning to self-censor means not learning how to choose to refrain from expressing some of your thoughts and feelings, then it’s a bad thing; if that’s what it means, then everybody should learn to self-censor. This position is entirely consistent with supposing that sometimes people make bad use of the ability to self-censor. I imagine that Ursula Le Guin, in the article mentioned, was referring to specific examples of people making bad use of their ability to self-censor.

I remember some friends who worked in China at the end of the 1970s and early 1980s explaining how Chinese parenting worked. The parent would not chastise a child for reaching for something the parent didn’t want them to have. They would simply take the reaching hand and put it back in the child’s lap, time after time, again and again, until, eventually, the child doesn’t bother to reach at all.

Whether that was a good thing or a bad thing depends on why the children were reaching for the objects and why the parents didn’t want the children to have the desired objects.

Comments on this entry are closed.